A new modified HS algorithm with strong Powell-Wolfe line search for unconstrained optimization
Authors
Abstract
Optimization is now considered a branch of computational science. This field seeks to answer the question "what is best?" by studying problems in which quality can be expressed numerically. One of the most well-known methods for solving nonlinear, unconstrained optimization problems is the conjugate gradient (CG) method, and the Hestenes–Stiefel (HS-CG) formula is among the oldest and most effective CG formulas. When an exact line search is used, the HS method achieves global convergence; however, this is not guaranteed under an inexact line search (ILS). Furthermore, the method does not always satisfy the descent property. The goal of this work is to create a new (modified) method by reformulating the classic HS-CG parameter and adding a term to the formula. It is critical that the proposed method generate a direction satisfying the sufficient descent property (SDP) under the strong Wolfe-Powell line search (sWPLS) at every iteration, so that the global convergence property (GCP) for general non-convex functions is guaranteed. Using the sWPLS, the modified method (mHS-CG) possesses the SDP regardless of the line search type and guarantees the GCP. It also has the advantage of keeping the CG scalar non-negative under the sWPLS. The paper is significant in that it quantifies how much the modification improves performance compared with the standard HS method. As a result, numerical experiments comparing mHS-CG under the sWPLS against the standard method on test problems show that the modified CG method is more robust.
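The abstract does not give the modified HS formula itself, only that the classic parameter is reformulated and kept non-negative under the strong Wolfe-Powell line search. As a rough illustration of the general scheme being discussed, the following is a minimal sketch of a CG iteration with the *classic* HS parameter truncated at zero (an assumption standing in for the paper's actual modification), using SciPy's strong-Wolfe line search:

```python
import numpy as np
from scipy.optimize import line_search

def hs_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """CG with a Hestenes-Stiefel-type beta and a strong Wolfe line search.

    NOTE: the paper's modified beta is not given in the abstract; here we
    use the classic HS formula truncated at zero, beta = max(beta_HS, 0),
    which only mirrors the stated goal of keeping the scalar non-negative.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # scipy's line_search enforces the strong Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                    # line search failed:
            d = -g                           # restart along steepest descent
            alpha = 1e-4                     # small fallback step (assumption)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                        # gradient difference y_k
        beta = max(g_new @ y / (d @ y), 0.0) # truncated HS beta (assumption)
        d = -g_new + beta * d                # new search direction
        x, g = x_new, g_new
    return x
```

For example, on the quadratic `f(x) = x.T @ x` the iteration reaches the minimizer at the origin in one strong-Wolfe step.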
Similar resources
A Conjugate Gradient Method with Strong Wolfe-Powell Line Search for Unconstrained Optimization
In this paper, a modified conjugate gradient method is presented for solving large-scale unconstrained optimization problems, which possesses the sufficient descent property with the Strong Wolfe-Powell line search. A global convergence result is proved when the (SWP) line search is used under some conditions. Computational results for a set consisting of 138 unconstrained optimization test probl...
A Line Search Algorithm for Unconstrained Optimization
It is well known that the line search methods play a very important role for optimization problems. In this paper a new line search method is proposed for solving unconstrained optimization. Under weak conditions, this method possesses global convergence and R-linear convergence for nonconvex function and convex function, respectively. Moreover, the given search direction has sufficiently desce...
A new hybrid conjugate gradient algorithm for unconstrained optimization
In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. This new method can generate sufficient descent directions unrelated to any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
A Modified BFGS Algorithm for Unconstrained Optimization
In this paper we present a modified BFGS algorithm for unconstrained optimization. The BFGS algorithm updates an approximate Hessian which satisfies the most recent quasi-Newton equation. The quasi-Newton condition can be interpreted as the interpolation condition that the gradient value of the local quadratic model matches that of the objective function at the previous iterate. Our modified al...
Journal
Journal title: Eastern-European Journal of Enterprise Technologies
Year: 2022
ISSN: 1729-3774, 1729-4061
DOI: https://doi.org/10.15587/1729-4061.2022.254017